Graph isomorphism testing


On the equivalence between graph isomorphism testing and function approximation with GNNs

Chen, Zhengdao, Villar, Soledad, Chen, Lei, Bruna, Joan

Neural Information Processing Systems

Graph neural networks (GNNs) have achieved considerable success on graph-structured data. In light of this, there has been increasing interest in studying their representation power. One line of work focuses on the universal approximation of permutation-invariant functions by certain classes of GNNs, and another demonstrates the limitations of GNNs via graph isomorphism tests. Our work connects these two perspectives and proves their equivalence. We further develop a framework for the representation power of GNNs in the language of sigma-algebras, which incorporates both viewpoints. Using this framework, we compare the expressive power of different classes of GNNs as well as other methods on graphs. In particular, we prove that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree. We then extend them to a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and performs well on tasks over real-world datasets.
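The regular-graph limitation mentioned in the abstract has a classic concrete instance in the 1-dimensional Weisfeiler-Lehman (WL) color-refinement test, whose discriminative power upper-bounds message-passing GNNs and which is related in spirit to the order-2 G-invariant result proved in the paper. Below is a minimal self-contained sketch (not the paper's code; `wl_colors` and the two example graphs are illustrative):

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """Run 1-WL color refinement and return the final color histogram.
    adj: dict mapping each node to a list of its neighbors."""
    colors = {v: len(adj[v]) for v in adj}  # initialize colors by degree
    for _ in range(rounds):
        # new color = (own color, sorted multiset of neighbor colors)
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in adj}
    return Counter(colors.values())

# Two non-isomorphic 2-regular graphs on 6 nodes:
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}   # one 6-cycle
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}            # two disjoint triangles

print(wl_colors(cycle6) == wl_colors(two_triangles))  # prints True
```

Every node in both graphs has degree 2 with identically colored neighbors, so all nodes keep the same color in every round and the histograms coincide: the test, and any message-passing GNN bounded by it, cannot tell a 6-cycle from two disjoint triangles.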


A Novel Sliced Fused Gromov-Wasserstein Distance

Piening, Moritz, Beinert, Robert

arXiv.org Artificial Intelligence

The Gromov-Wasserstein (GW) distance and its fused extension (FGW) are powerful tools for comparing heterogeneous data. Their computation is, however, challenging, since both distances are based on non-convex, quadratic optimal transport (OT) problems. Leveraging 1D OT, a sliced version of GW has been proposed to lower the computational burden. Unfortunately, this sliced version is restricted to Euclidean geometry and loses invariance to isometries, strongly limiting its application in practice. To overcome these issues, we propose a novel slicing technique for GW as well as for FGW that is based on an appropriate lower bound, hierarchical OT, and suitable quadrature rules for the underlying 1D OT problems. Our novel sliced FGW significantly reduces the numerical effort while remaining invariant to isometric transformations and allowing the comparison of arbitrary geometries. We show that our new distance actually defines a pseudo-metric for structured spaces that bounds FGW from below, and we study its interpolation properties between sliced Wasserstein and GW. Since we avoid the underlying quadratic program, our sliced distance is numerically more robust and reliable than the original GW and FGW distances, especially in the context of shape retrieval and graph isomorphism testing.
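The computational appeal of slicing, as the abstract notes, comes from the fact that 1D OT between two equal-size point sets reduces to sorting. A minimal sketch of plain sliced Wasserstein illustrates the principle (this is not the authors' sliced FGW; `sliced_wasserstein` is an illustrative name and the Monte-Carlo projection count is arbitrary):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between point clouds
    X, Y of shape (n, d) with equal n: along each random direction,
    the 1D OT plan is given by matching sorted projections."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)           # random unit direction
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)         # closed-form 1D OT cost
    return np.sqrt(total / n_proj)

X = np.array([[0.0, 0.0], [1.0, 0.0]])
Y = X + np.array([2.0, 0.0])   # the same cloud translated by 2 along x
print(sliced_wasserstein(X, Y))
```

Each projection costs only a sort, O(n log n), instead of solving a quadratic program; the value above is a Monte-Carlo estimate (for a pure translation it concentrates around the analytic value). As the abstract points out, naive slicing of this kind is tied to Euclidean geometry and is not isometry-invariant, which is what the proposed lower-bound and hierarchical-OT construction addresses.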



Reviews: On the equivalence between graph isomorphism testing and function approximation with GNNs

Neural Information Processing Systems

The paper targets the problem of measuring the representation power of graph neural networks (GNNs), an interesting and important topic that has become popular recently (partially due to two prominent prior works, including Xu et al.). There are three main contributions. 1. Establishing the equivalence between two ways of measuring GNN representation power: (i) the ability to approximate permutation-invariant functions, and (ii) the ability to distinguish non-isomorphic graphs. Although not very surprising, this is a nice observation. The authors show that these sigma-algebras are an equivalent way to measure the representation power of GNNs; for instance, the inclusion of sigma-algebras originating from two models is equivalent to saying one model is more powerful than the other. This is a potentially useful observation.


Reviews: On the equivalence between graph isomorphism testing and function approximation with GNNs

Neural Information Processing Systems

This paper leverages the graph isomorphism problem to study the expressive power of GNNs. In addition, a measure of expressiveness is formalized using sigma-algebras, and the authors propose a novel variant of GNN, RING-GNN, which is evaluated in an experimental study where it shows competitive results. The reviewers agree that this is a nice contribution, that the theoretical results are interesting (though somewhat expected), and that the proposed extension of G-invariant networks is relevant. However, all reviewers agree that the experimental comparison with RING-GNN-SVD is unfair and MUST BE REMOVED in a published version of the paper (that is, removing the last line from Table 1). One of the reviewers also notes that a comparison with LanczosNet should be included (though the lack of this comparison is not grounds for rejection).




On the equivalence between graph isomorphism testing and function approximation with GNNs

Chen, Zhengdao, Villar, Soledad, Chen, Lei, Bruna, Joan

arXiv.org Machine Learning

Graph neural networks (GNNs) have achieved considerable success on graph-structured data. In light of this, there has been increasing interest in studying their representation power. One line of work focuses on the universal approximation of permutation-invariant functions by certain classes of GNNs, and another demonstrates the limitations of GNNs via graph isomorphism tests. Our work connects these two perspectives and proves their equivalence. We further develop a framework for the representation power of GNNs in the language of sigma-algebras, which incorporates both viewpoints. Using this framework, we compare the expressive power of different classes of GNNs as well as other methods on graphs. In particular, we prove that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree. We then extend them to a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and provides improvements on real-world social network datasets.